
    Not-a-Bot (NAB): Improving Service Availability in the Face of Botnet Attacks

    A large fraction of email spam, distributed denial-of-service (DDoS) attacks, and click-fraud on web advertisements is caused by traffic sent from compromised machines that form botnets. This paper posits that by identifying human-generated traffic as such, one can service it with improved reliability or higher priority, mitigating the effects of botnet attacks. The key challenge is to identify human-generated traffic in the absence of strong unique identities. We develop NAB ("Not-A-Bot"), a system to approximately identify and certify human-generated activity. NAB uses a small trusted software component called an attester, which runs on the client machine with an untrusted OS and applications. The attester tags each request with an attestation if the request is made within a small amount of time of legitimate keyboard or mouse activity. The remote entity serving the request sends the request and attestation to a verifier, which checks the attestation and implements an application-specific policy for attested requests. Our implementation of the attester is within the Xen hypervisor. By analyzing traces of keyboard and mouse activity from 328 users at Intel, together with adversarial traces of spam, DDoS, and click-fraud activity, we estimate that NAB reduces the amount of spam that currently passes through a tuned spam filter by more than 92%, while not flagging any legitimate email as spam. NAB delivers similar benefits to legitimate requests under DDoS and click-fraud attacks.
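
    The attestation flow the abstract describes can be sketched in a few lines. The snippet below only illustrates the timing check and the verifier's policy split, not NAB's actual API; the names (Attester, ATTESTATION_WINDOW, verifier_policy) and the one-second window are made up for this sketch, and a real attester would run inside the trusted hypervisor and cryptographically bind each attestation to its request.

```python
# Toy sketch of the attestation idea (illustrative names, not NAB's real API):
# a request earns an attestation only if it closely follows human input.
import time

ATTESTATION_WINDOW = 1.0  # seconds; assumed value, the real system tunes this


class Attester:
    """Trusted component: remembers the last keyboard/mouse event."""

    def __init__(self):
        self.last_input_time = float("-inf")

    def on_keyboard_or_mouse_event(self):
        # Invoked by the trusted input path, not by untrusted applications.
        self.last_input_time = time.monotonic()

    def attest(self, request_bytes: bytes):
        """Return an attestation if the request closely follows human input."""
        if time.monotonic() - self.last_input_time <= ATTESTATION_WINDOW:
            # A real attester would sign a digest binding the attestation to
            # the request; here we just return a tag for illustration.
            return {"attested": True, "request_digest": hash(request_bytes)}
        return None


def verifier_policy(request_bytes: bytes, attestation) -> str:
    """Application-specific policy: favor attested (likely human) traffic."""
    if attestation and attestation.get("attested"):
        return "serve with high priority"
    return "serve with low priority / extra scrutiny"
```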

    The role of coding in the choice between routing and coding for wireless unicast

    We consider the benefits of coding in wireless networks, specifically its role in exploiting the local broadcast property of the wireless medium. We first argue that for unicast, the throughput achieved with network coding is the same as that achieved without any coding. This argument highlights the role of a general max-flow min-cut duality and is more explicit than previous proofs. The maximum throughput can be achieved in multiple ways without any coding, for example, using backpressure routing, or using some centralized flow scheduler that is aware of the network topology. However, all such schemes, in order to take advantage of the local broadcast property, require dynamic routing decisions for choosing the next hop for each packet from among the nodes where it is successfully received. This choice seems to depend critically on feedback signaling information such as queue lengths or ARQ. In contrast, network coding can achieve the same throughput without such feedback, in exchange for decoding overhead. A key issue in comparing routing and coding is therefore how critical feedback signaling is for the throughput of routing policies. With this motivation, we first explore how feedback at a given node affects its throughput, with arbitrary rates of its one-hop neighbors to the destination. Static routing policies, which are essentially feedback independent, are considered. An explicit characterization of the optimal policies under such a feedback constraint is obtained, which turns out to be a natural generalization of both flooding and traditional routing (which does not exploit local broadcast, because the next hop is fixed prior to the transmission). When losses at the receivers are independent (still allowing for dependencies on transmissions by two different nodes, to model interference), the reduction in capacity due to constraining the feedback is limited to a constant fraction (1/e ≈ 37%) of the coding capacity, and gets arbitrarily close to optimal as the unconstrained capacity goes to zero. We also extend this analysis to a layered multihop network and compare the throughput of flooding to backpressure via simulations for a layered network assuming independent losses. Finally, if there are dependencies in the losses seen by receivers from a single broadcast, the reduction could be arbitrarily bad, even with just two hops.
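
    The local broadcast gain that the abstract contrasts against fixed-next-hop routing can be illustrated with a small Monte Carlo sketch. This is a toy construction of my own, not the paper's model or policy class: one transmitter broadcasts to several next-hop candidates with independent losses, and we compare a next hop fixed before transmission (traditional routing) with an opportunistic rule that accepts delivery to any receiver, which is the gain that feedback or coding is needed to realize.

```python
# Monte Carlo sketch (my own construction, not the paper's model): one sender
# broadcasting to k next-hop candidates with independent success probabilities.
# "Fixed next hop" succeeds only if the pre-chosen receiver gets the packet;
# the opportunistic rule succeeds if any receiver does.
import random

def single_broadcast(success_probs):
    """Simulate one broadcast; return the set of receivers that got the packet."""
    return {i for i, p in enumerate(success_probs) if random.random() < p}

def compare(success_probs, trials=100_000):
    fixed_hits = opportunistic_hits = 0
    best_receiver = max(range(len(success_probs)), key=lambda i: success_probs[i])
    for _ in range(trials):
        received = single_broadcast(success_probs)
        if best_receiver in received:
            fixed_hits += 1          # next hop fixed before the transmission
        if received:
            opportunistic_hits += 1  # any receiver may forward the packet
    return fixed_hits / trials, opportunistic_hits / trials

if __name__ == "__main__":
    fixed, opportunistic = compare([0.4, 0.4, 0.4])
    print(f"fixed next hop: {fixed:.3f}, opportunistic: {opportunistic:.3f}")
    # With three receivers at 0.4 each: roughly 0.40 vs 0.78 per-transmission delivery.
```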

    Clicktok: click fraud detection using traffic analysis

    Advertising is a primary means of revenue generation for millions of websites and smartphone apps. Naturally, a fraction of these abuse ad networks to systematically defraud advertisers of their money. Modern defences have matured to overcome some forms of click fraud, but measurement studies have reported that a third of clicks supplied by ad networks could be clickspam. Our work develops novel inference techniques which can isolate click fraud attacks using their fundamental properties. We propose two defences, mimicry and bait-click, which provide clickspam detection with substantially improved results over current approaches. Mimicry leverages the observation that organic click fraud involves the reuse of legitimate click traffic, and thus isolates clickspam by detecting patterns of click reuse within ad network clickstreams. The bait-click defence leverages the vantage point of an ad network to inject a pattern of bait clicks into a user's device. Any organic clickspam generated involving the bait clicks will subsequently be recognisable by the ad network. Our experiments show that the mimicry defence detects around 81% of fake clicks in stealthy (low-rate) attacks, with a false-positive rate of 110 per hundred thousand clicks. Similarly, the bait-click defence enables further improvements in detection, with rates of 95% and false-positive rates of between 0 and 30 clicks per million, a substantial improvement over current approaches.
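
    The click-reuse intuition behind the mimicry defence can be illustrated with a toy detector. The code below is a hypothetical sketch, not Clicktok's actual algorithm: it quantizes inter-click gaps, builds short timing signatures, and flags signatures that recur suspiciously often within one clickstream, on the assumption that replayed legitimate traffic repeats its timing pattern; all names and thresholds are invented for the illustration.

```python
# Toy illustration of the click-reuse intuition (hypothetical code, not
# Clicktok's algorithm): replayed click traffic tends to repeat inter-click
# timing patterns, so recurring timing signatures are suspicious.
from collections import Counter

def timing_signatures(click_times, ngram=3, resolution=0.05):
    """Quantize inter-click gaps and emit overlapping n-gram signatures."""
    gaps = [round((b - a) / resolution) for a, b in zip(click_times, click_times[1:])]
    return [tuple(gaps[i:i + ngram]) for i in range(len(gaps) - ngram + 1)]

def flag_reused_patterns(click_times, min_repeats=3):
    """Return timing signatures that recur suspiciously often in one clickstream."""
    counts = Counter(timing_signatures(click_times))
    return {sig for sig, n in counts.items() if n >= min_repeats}

# Example: a replayed burst of clicks repeats the same three-gap pattern.
stream = [0.0, 0.2, 0.5, 0.9, 5.0, 5.2, 5.5, 5.9, 12.0, 12.2, 12.5, 12.9]
print(flag_reused_patterns(stream))  # flags the repeated (0.2, 0.3, 0.4) gap pattern
```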

    Repeated auctions under budget constraints: Optimal bidding strategies and equilibria. SSRN working draft

    How should agents bid in repeated sequential auctions when they are budget constrained? A motivating example is that of sponsored search auctions, where advertisers bid in a sequence of generalized second price (GSP) auctions. These auctions, specifically in the context of sponsored search, have many idiosyncratic features that distinguish them from other models of sequential auctions: First, each bidder competes in a large number of auctions, where each auction is worth very little. Second, the total bidder population is often large, which means it is unrealistic to assume that the bidders could possibly optimize their strategy by modeling specific opponents. Third, the presence of a virtually unlimited supply of these auctions means bidders are necessarily budget constrained. Motivated by these three factors, we frame the generic problem as a discounted Markov Decision Process in which the environment is independent and identically distributed over time. We also allow the agents to receive income to augment their budget at a constant rate. We first provide a structural characterization of the associated value function and the optimal bidding strategy, which specifies the extent to which agents underbid relative to their true valuation due to long-term budget constraints. We then provide an explicit characterization of the optimal bid shading factor in the limiting regime where the discount rate tends to zero, by identifying the limit of the value function in terms of the solution to a differential equation that can be solved efficiently. Finally, we prove the existence of Mean Field Equilibria for both the repeated second price and GSP auctions with a large number of bidders.
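
    The discounted MDP described above can be sketched with a generic Bellman equation for a budget-constrained bidder facing i.i.d. auctions. The notation below (remaining budget b, per-auction income c, discount factor β, value v, competing price p) is illustrative and not necessarily the paper's exact formulation.

```latex
% Generic Bellman equation for a discounted, budget-constrained bidder
% (illustrative notation, not necessarily the paper's formulation).
% b: remaining budget, c: income received per auction, beta: discount factor,
% v: the bidder's value, p: the price set by competing bids (both i.i.d.).
% The bidder chooses a bid a <= b knowing v but not p, wins when a > p, pays p.
\[
V(b) \;=\; \mathbb{E}_{v}\!\left[
  \max_{0 \le a \le b}\;
  \mathbb{E}_{p}\!\left[
    \mathbf{1}\{a > p\}\,(v - p)
    \;+\; \beta\, V\!\bigl(b + c - \mathbf{1}\{a > p\}\,p\bigr)
  \right]
\right]
\]
% Bid shading: because paying p today lowers the continuation value V, the
% maximizing bid a*(v, b) sits below the true value v; the gap is the shading
% factor whose limit (as the discount rate tends to zero) the paper characterizes.
```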

    Reliable and efficient programming abstractions for wireless sensor networks

    It is currently difficult to build practical and reliable programming systems out of distributed and resource-constrained sensor devices. The state of the art in today’s sensornet programming is centered around a component-based language called nesC. nesC is a node-level language—a program is written for an individual node in the network—and nesC programs use the services of an operating system called TinyOS. We are pursuing an approach to programming sensor networks that significantly raises the level of abstraction over this practice. The critical change is one of perspective: rather than writing programs from the point of view of an individual node, programmers implement a central program that conceptually has access to the entire network. This approach pushes to the compiler the task of producing node-level programs that implement the desired behavior. We present the Pleiades programming language, its compiler …
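
    To make the change of perspective concrete, here is an illustrative sketch in Python, not actual Pleiades or nesC syntax: in the central-program view the programmer manipulates the whole network as ordinary data, and it is the compiler of such a language that must turn this into per-node programs and messaging.

```python
# Illustrative sketch only (Python, not Pleiades or nesC syntax): the central
# program iterates over all nodes as if the network were local data; a compiler
# for a network-level language would generate the per-node code and messaging.
from dataclasses import dataclass

@dataclass
class Node:
    node_id: int
    temperature: float
    neighbors: list

def hottest_node(network: list) -> Node:
    # Network-wide perspective: no per-node message handlers are written by hand.
    return max(network, key=lambda n: n.temperature)

# In the node-level (nesC/TinyOS) style, the same task would be written as
# event handlers on each node that exchange and aggregate readings hop by hop,
# which is exactly the coordination the central-program view hides.
network = [Node(0, 21.5, [1]), Node(1, 24.0, [0, 2]), Node(2, 22.3, [1])]
print(hottest_node(network).node_id)
```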